Geometry of Least Squares Estimator

#econometrics #economics

Oh, Hyunzi. (email: wisdom302@naver.com)
Korea University, Graduate School of Economics.
2024 Spring, instructed by Prof. Dukpa Kim.


Main References

  • Kim, Dukpa. (2024). "Econometric Analysis" (2024 Spring) ECON 518, Department of Economics, Korea University.
  • Davidson and MacKinnon. (2021). "Econometric Theory and Methods", Oxford University Press, New York.

Model

In vector form, the model is
$$y_i = x_i'\beta + u_i$$
for all $i = 1, \dots, n$, where $x_i = (x_{i1}, \dots, x_{ik})'$ is the $k \times 1$ vector of regressors for observation $i$ and $\beta = (\beta_1, \dots, \beta_k)'$. Alternatively,
$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + u_i,$$
where $x_{ij}$ is the $j$-th regressor, for $j = 1, \dots, k$.

Equivalently, in matrix form, the model is
$$y = X\beta + u,$$
where $y = (y_1, \dots, y_n)'$, $u = (u_1, \dots, u_n)'$, and $X = (x_1, \dots, x_n)'$ is the $n \times k$ regressor matrix. Therefore, $X\beta$ is a linear combination of the columns of $X$.

Remark (linear combination).

Consider a set of vectors $\{z_1, \dots, z_k\} \subset \mathbb{R}^n$. We say that $\{z_1, \dots, z_k\}$ spans a subspace $S \subseteq \mathbb{R}^n$ if every vector $z \in S$ can be written as a linear combination of $z_1, \dots, z_k$, i.e.
$$z = c_1 z_1 + c_2 z_2 + \cdots + c_k z_k \quad \text{for some scalars } c_1, \dots, c_k.$$

Remark ($X$ as a linear transformation).

Let a linear model be $y = X\beta + u$, where $y, u \in \mathbb{R}^n$ and $\beta \in \mathbb{R}^k$. Then, an $n \times k$ matrix $X$ can be considered as a linear transformation from $\mathbb{R}^k$ to $\mathbb{R}^n$, since it maps $\beta \in \mathbb{R}^k$ to another vector $X\beta \in \mathbb{R}^n$.

Ordinary Least Squares

Proposition (Ordinary Least Squares estimator of $\beta$).

Given the model $y = X\beta + u$, the (ordinary) least squares estimator for $\beta$ is
$$\hat{\beta} = (X'X)^{-1}X'y,$$
which solves
$$X'(y - X\hat{\beta}) = 0,$$
called the normal equations, and is derived by
$$\hat{\beta} = \arg\min_{\beta} \; (y - X\beta)'(y - X\beta).$$

Proof. We first solve for $\hat{\beta}$ using the first-order condition, and then derive it again using the concept of the orthogonal complement, and the Method of Moments.

  1. using F.O.C.
    • From the definition,
      $$SSR(\beta) = (y - X\beta)'(y - X\beta) = y'y - 2\beta'X'y + \beta'X'X\beta,$$
      where the last equation holds by $\beta'X'y = y'X\beta$, since they are both scalars.
    • F.O.C.
      $$\frac{\partial SSR(\beta)}{\partial \beta} = -2X'y + 2X'X\hat{\beta} = 0 \;\; \Longrightarrow \;\; X'X\hat{\beta} = X'y \;\; \Longrightarrow \;\; \hat{\beta} = (X'X)^{-1}X'y.$$
  2. using Orthogonal Projection
    • The residual $y - X\hat{\beta}$ must be orthogonal to every column of $X$ (i.e. to the range space of $X$), so $X'(y - X\hat{\beta}) = 0$, which are exactly the normal equations above.
  3. using MOM
    • Given the assumption that $E[x_i u_i] = 0$ and the true model $y_i = x_i'\beta + u_i$, we have $E[x_i(y_i - x_i'\beta)] = 0$.
    • By replacing the population mean with the sample mean, we have
      $$\frac{1}{n}\sum_{i=1}^n x_i(y_i - x_i'\hat{\beta}) = 0 \quad \Longleftrightarrow \quad X'(y - X\hat{\beta}) = 0,$$
      which yields exactly the same result.

This completes the proof.
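As a quick numerical illustration (not part of the original notes), the sketch below generates an arbitrary simulated data set and checks that solving the normal equations $X'X\hat{\beta} = X'y$ reproduces the least squares solution returned by NumPy; all variable names and data-generating values here are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3                      # sample size and number of regressors (arbitrary)
X = rng.normal(size=(n, k))        # simulated regressor matrix
beta = np.array([1.0, -2.0, 0.5])  # assumed true coefficients
y = X @ beta + rng.normal(size=n)  # y = X beta + u

# OLS from the normal equations X'X beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Same answer from NumPy's least squares routine
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(beta_hat, beta_lstsq))         # True
print(np.allclose(X.T @ (y - X @ beta_hat), 0))  # normal equations hold
```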

Orthogonal Projections

Using Proposition 3 (Ordinary Least Squares estimator of $\beta$), we can decompose $y$ into two parts,
$$y = X\hat{\beta} + \hat{u} = P_X y + M_X y,$$
where $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$. Here $\hat{y} = P_X y = X\hat{\beta}$ is called the fitted value and $\hat{u} = M_X y = y - X\hat{\beta}$ is called the residual.
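The following sketch (an illustration of my own, on simulated data) computes the fitted values and residuals directly from the projection matrices and checks that they add up to $y$ and are orthogonal to each other.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 2
X = rng.normal(size=(n, k))
y = rng.normal(size=n)

P = X @ np.linalg.inv(X.T @ X) @ X.T   # P_X
M = np.eye(n) - P                      # M_X

y_hat = P @ y        # fitted values, equal to X beta_hat
u_hat = M @ y        # residuals

print(np.allclose(y, y_hat + u_hat))   # y = P_X y + M_X y
print(np.isclose(y_hat @ u_hat, 0.0))  # fitted values orthogonal to residuals
```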

Properties of Projection Matrices

Proposition (symmetric and idempotent of projection matrices).

The projection matrices $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$ are both symmetric and idempotent.

Proof. First we show that $P_X$ is symmetric and idempotent.

  • $P_X$ is symmetric: $P_X' = \left(X(X'X)^{-1}X'\right)' = X\left((X'X)^{-1}\right)'X' = X\left((X'X)'\right)^{-1}X' = X(X'X)^{-1}X' = P_X$.
  • $P_X$ is idempotent: $P_X P_X = X(X'X)^{-1}X'X(X'X)^{-1}X' = X(X'X)^{-1}X' = P_X$.

Next we show that $M_X$ is symmetric and idempotent.

  • $M_X$ is symmetric: $M_X' = (I_n - P_X)' = I_n - P_X' = I_n - P_X = M_X$.
  • $M_X$ is idempotent: $M_X M_X = (I_n - P_X)(I_n - P_X) = I_n - 2P_X + P_X P_X = I_n - P_X = M_X$. This completes the proof.
Lemma (rank of $P_X$).

For a matrix $X$ of dimension $n \times k$ with $\operatorname{rank}(X) = k$, we have
$$\operatorname{rank}(P_X) = k.$$

Proof. Since $P_X$ is of dimension $n \times n$, symmetric, and idempotent, we have
$$\operatorname{rank}(P_X) = \operatorname{tr}(P_X) = \operatorname{tr}\left(X(X'X)^{-1}X'\right) = \operatorname{tr}\left((X'X)^{-1}X'X\right) = \operatorname{tr}(I_k) = k;$$
note that the first equation holds by Introductory Linear Algebra > Remark 13 (rank of symmetric idempotent). Therefore, we have $\operatorname{rank}(P_X) = k$.
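A short numerical check (my own sketch, on an arbitrary simulated design) of the proposition and the lemma above: $P_X$ and $M_X$ are symmetric and idempotent, and their ranks equal $k$ and $n-k$, matching their traces.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 40, 3
X = rng.normal(size=(n, k))
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

print(np.allclose(P, P.T), np.allclose(P @ P, P))     # P_X symmetric, idempotent
print(np.allclose(M, M.T), np.allclose(M @ M, M))     # M_X symmetric, idempotent
print(np.linalg.matrix_rank(P), round(np.trace(P)))   # both equal k
print(np.linalg.matrix_rank(M), round(np.trace(M)))   # both equal n - k
```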

Remark (projection results).

For $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$, we have
$$P_X X = X \quad \text{and} \quad M_X X = 0.$$

Proof. First, $P_X X = X(X'X)^{-1}X'X = X$, and then $M_X X = (I_n - P_X)X = X - P_X X = X - X = 0$; this completes the proof.

The result of Remark 6 (projection results) will be further analysed in the section Orthogonal Projections.

Proposition (complementary projections).

Let $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$. Then the two projections are complementary, i.e.
$$P_X + M_X = I_n.$$

Proof. The result comes trivially, since $P_X + M_X = P_X + (I_n - P_X) = I_n$. This completes the proof.

Exercise (ch1.6, annihilating projections).

Let $P_X = X(X'X)^{-1}X'$ and $M_X = I_n - P_X$. Then the two projections annihilate each other, i.e.
$$P_X M_X = M_X P_X = 0.$$

Proof. First we prove $P_X M_X = 0$:
$$P_X M_X = P_X(I_n - P_X) = P_X - P_X P_X = P_X - P_X = 0,$$
where the third equality holds by Proposition 4 (symmetric and idempotent of projection matrices).

Next we prove $M_X P_X = 0$:
$$M_X P_X = (I_n - P_X)P_X = P_X - P_X P_X = P_X - P_X = 0.$$
This completes the proof.
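A quick check (again an illustration on simulated data of my own) of the complementary and annihilating properties, $P_X + M_X = I_n$ and $P_X M_X = M_X P_X = 0$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 30, 4
X = rng.normal(size=(n, k))
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

print(np.allclose(P + M, np.eye(n)))          # complementary: P_X + M_X = I_n
print(np.allclose(P @ M, np.zeros((n, n))))   # annihilating: P_X M_X = 0
print(np.allclose(M @ P, np.zeros((n, n))))   # annihilating: M_X P_X = 0
```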


Orthogonal Projections

Lemma (range space of projection mapping).

For the projection mappings $P_X$ and $M_X$, we have
$$\mathcal{R}(P_X) = \mathcal{R}(X) \quad \text{and} \quad \mathcal{R}(M_X) = \mathcal{R}(X)^{\perp},$$
where $\mathcal{R}(\cdot)$ denotes the range (column) space and $\mathcal{R}(X)^{\perp}$ its orthogonal complement.

Proof. First we show $\mathcal{R}(P_X) = \mathcal{R}(X)$.
($\subseteq$) First assume $z \in \mathcal{R}(P_X)$. Then by Introductory Linear Algebra > Definition 7 (range or column space), there exists some $b \in \mathbb{R}^n$ such that $z = P_X b = X(X'X)^{-1}X'b$. Thus we have $z = Xc$ with $c = (X'X)^{-1}X'b$; since $c \in \mathbb{R}^k$, we have $z \in \mathcal{R}(X)$.

($\supseteq$) Assume $z \in \mathcal{R}(X)$. Then similarly, there exists some $c \in \mathbb{R}^k$ such that $z = Xc$. Then we have $P_X z = P_X X c = Xc = z$, thus $z \in \mathcal{R}(P_X)$, i.e. $\mathcal{R}(X) \subseteq \mathcal{R}(P_X)$.

Next we show $\mathcal{R}(M_X) = \mathcal{R}(X)^{\perp}$.
($\supseteq$) First assume $z \in \mathcal{R}(X)^{\perp}$. Then by Introductory Linear Algebra > Definition 2 (orthogonal complement), $z'Xc = 0$ for every $c \in \mathbb{R}^k$. Since this holds for every $c$, by letting $c = X'z$, we have $z'XX'z = \lVert X'z \rVert^2 = 0$, which leads to $X'z = 0$. Thus we have
$$M_X z = \left(I_n - X(X'X)^{-1}X'\right)z = z - X(X'X)^{-1}X'z = z.$$
Therefore, there exists some $b$ (namely $b = z$) such that $z = M_X b$, i.e. $z \in \mathcal{R}(M_X)$.

($\subseteq$) Lastly, assume $z \in \mathcal{R}(M_X)$, meaning that there exists some $b \in \mathbb{R}^n$ such that $z = M_X b$. Then, by definition, $z = (I_n - P_X)b$. Therefore, for all $w = Xc \in \mathcal{R}(X)$, we have
$$w'z = c'X'(I_n - P_X)b = c'(X' - X'P_X)b = c'(X' - X')b = 0,$$
where the last equality holds since $X'P_X = (P_X X)' = X'$ as $P_X X = X$. This completes the proof.

Proposition (projection mapping).

Let
$$P_X = X(X'X)^{-1}X' \quad \text{and} \quad M_X = I_n - P_X.$$
Then, $P_X$ is an orthogonal projection mapping onto $\mathcal{R}(X)$ and $M_X$ is an orthogonal projection mapping onto $\mathcal{R}(X)^{\perp}$.

Proof. Since Lemma 9 (range space of projection mapping) shows that $\mathcal{R}(P_X) = \mathcal{R}(X)$ and $\mathcal{R}(M_X) = \mathcal{R}(X)^{\perp}$, by Introductory Linear Algebra > Theorem 16 (orthogonal projection is symmetric and idempotent), it is sufficient to show that both $P_X$ and $M_X$ are symmetric and idempotent, which has been shown by Proposition 4 (symmetric and idempotent of projection matrices).
Therefore, $P_X$ is an orthogonal projection onto $\mathcal{R}(X)$ and $M_X$ is an orthogonal projection onto $\mathcal{R}(X)^{\perp}$.

Understanding projection matrices

The matrix $P_X$ maps any given vector in $\mathbb{R}^n$ onto the range space of $X$ by projecting it orthogonally. In contrast, the matrix $M_X$ maps any given vector onto the orthogonal complement of the range space of $X$, also by projecting orthogonally. Thus $P_X y$ and $M_X y$ are orthogonal, i.e. $(P_X y)'(M_X y) = 0$.


Eigenvalue Decomposition

Corollary (eigen-decomposition of projection matrix).

For the projection matrix $P_X = X(X'X)^{-1}X'$, where $X$ is an $n \times k$ matrix with $\operatorname{rank}(X) = k$, there exists an orthogonal matrix $C$ such that
$$P_X = C \Lambda C', \qquad \Lambda = \begin{pmatrix} I_k & 0 \\ 0 & 0 \end{pmatrix}.$$

Proof. From the spectral decomposition of the symmetric matrix $P_X$, note that we have $P_X = C\Lambda C'$ for some orthogonal $C$ and diagonal $\Lambda$, and since $\operatorname{rank}(C\Lambda C') = \operatorname{rank}(\Lambda)$ if both $C$ and $C'$ are invertible, we have $\operatorname{rank}(\Lambda) = \operatorname{rank}(P_X) = k$. Moreover, idempotency of $P_X$ gives $C\Lambda C' = C\Lambda C' C\Lambda C' = C\Lambda^2 C'$, so $\Lambda^2 = \Lambda$ and every diagonal entry of $\Lambda$ is either $0$ or $1$. Thus we have $P_X = C\Lambda C'$, where the columns of $C$ are orthonormal eigenvectors of $P_X$ and
$$\Lambda = \begin{pmatrix} I_k & 0 \\ 0 & 0 \end{pmatrix}.$$
This completes the proof.
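As an illustration of the corollary (my own numerical sketch on a simulated $X$), the eigenvalues of $P_X$ computed below are $k$ ones and $n-k$ zeros, and the orthonormal eigenvectors reassemble $P_X$ as $C\Lambda C'$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 20, 3
X = rng.normal(size=(n, k))
P = X @ np.linalg.inv(X.T @ X) @ X.T

eigvals, C = np.linalg.eigh(P)   # orthonormal eigenvectors since P_X is symmetric
print(np.round(np.sort(eigvals)[::-1], 8))          # k ones followed by n-k zeros
print(np.allclose(C @ np.diag(eigvals) @ C.T, P))   # P_X = C Lambda C'
print(np.allclose(C.T @ C, np.eye(n)))              # C'C = I_n
```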


Iterated Projection

Theorem (iterated projection on range space).

Let $X = (X_1, X_2)$, so that $\mathcal{R}(X_1) \subseteq \mathcal{R}(X)$, and let $P_{X_1} = X_1(X_1'X_1)^{-1}X_1'$ and $P_X = X(X'X)^{-1}X'$. Then
$$P_X P_{X_1} = P_{X_1} P_X = P_{X_1}.$$

Proof. First we prove $P_X P_{X_1} = P_{X_1}$:
$$P_X P_{X_1} = P_X X_1 (X_1'X_1)^{-1}X_1' = X_1(X_1'X_1)^{-1}X_1' = P_{X_1},$$
where the second equation holds since $\mathcal{R}(X_1) \subseteq \mathcal{R}(X)$, and thus $P_X X_1 = X_1$.
Next, using Proposition 4 (symmetric and idempotent of projection matrices),
$$P_{X_1} P_X = (P_X' P_{X_1}')' = (P_X P_{X_1})' = P_{X_1}' = P_{X_1},$$
so $P_{X_1} P_X = P_{X_1}$ also holds.

Corollary (iterated projection on null space).

Let $X = (X_1, X_2)$, and let $M_{X_1} = I_n - P_{X_1}$ and $M_X = I_n - P_X$. Then
$$M_X M_{X_1} = M_{X_1} M_X = M_X.$$

Proof. Similarly,
$$M_X M_{X_1} = (I_n - P_X)(I_n - P_{X_1}) = I_n - P_X - P_{X_1} + P_X P_{X_1} = I_n - P_X = M_X,$$
since $P_X P_{X_1} = P_{X_1}$ by Theorem 12 (iterated projection on range space). Also,
$$M_{X_1} M_X = (M_X' M_{X_1}')' = (M_X M_{X_1})' = M_X' = M_X,$$
since $M_X$ and $M_{X_1}$ are symmetric, which holds by Proposition 4 (symmetric and idempotent of projection matrices).
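The sketch below (an illustration of my own, taking $X_1$ as the first columns of a simulated $X$) checks the iterated projection results $P_X P_{X_1} = P_{X_1} P_X = P_{X_1}$ and $M_X M_{X_1} = M_{X_1} M_X = M_X$.

```python
import numpy as np

rng = np.random.default_rng(5)
n, k, k1 = 30, 4, 2
X = rng.normal(size=(n, k))
X1 = X[:, :k1]                        # R(X1) is a subspace of R(X)

P  = X  @ np.linalg.inv(X.T  @ X)  @ X.T
P1 = X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
M, M1 = np.eye(n) - P, np.eye(n) - P1

print(np.allclose(P @ P1, P1), np.allclose(P1 @ P, P1))  # range-space result
print(np.allclose(M @ M1, M), np.allclose(M1 @ M, M))    # null-space result
```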


Uniqueness and Linear Transformation

Theorem (uniqueness of projection matrix).

Suppose two different matrices $X_1$ and $X_2$ have the same range space, i.e. $\mathcal{R}(X_1) = \mathcal{R}(X_2)$. Then
$$P_{X_1} = P_{X_2}.$$

Proof. From the previous result, $P_{X_2} P_{X_1} = P_{X_1}$ and $P_{X_1} P_{X_2} = P_{X_2}$, and since $P_{X_1}$ and $P_{X_2}$ are symmetric, we have
$$P_{X_1} = P_{X_2} P_{X_1} = (P_{X_1} P_{X_2})' = P_{X_2}' = P_{X_2}.$$
Therefore, we have $P_{X_1} = P_{X_2}$.

Theorem (linear transformations of regressor).

Let $A$ be a nonsingular $k \times k$ matrix. Then for the $n \times k$ matrix $X$, we have
$$P_{XA} = P_X.$$

Proof. We can derive the result directly:
$$P_{XA} = XA\left((XA)'(XA)\right)^{-1}(XA)' = XA(A'X'XA)^{-1}A'X' = XAA^{-1}(X'X)^{-1}(A')^{-1}A'X' = X(X'X)^{-1}X' = P_X.$$
This completes the proof.
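A numerical check (my own sketch, with an arbitrary simulated design) that the projection matrix is invariant to a nonsingular reparametrization of the regressors, $P_{XA} = P_X$:

```python
import numpy as np

rng = np.random.default_rng(6)
n, k = 25, 3
X = rng.normal(size=(n, k))
A = rng.normal(size=(k, k))        # almost surely nonsingular

def proj(Z):
    """Orthogonal projection matrix onto the column space of Z."""
    return Z @ np.linalg.inv(Z.T @ Z) @ Z.T

print(np.allclose(proj(X @ A), proj(X)))   # P_{XA} = P_X
```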


Projection on Intercept

Let $\iota = (1, 1, \dots, 1)'$ denote a constant term (an $n \times 1$ vector of ones), and let $P_\iota = \iota(\iota'\iota)^{-1}\iota'$ denote a projection matrix on the constant term.

Proposition (constant term averaging).

For the given projection matrix $P_\iota$ on the range of a constant term, $P_\iota$ maps any vector $y \in \mathbb{R}^n$ to a vector of its average, $P_\iota y = \bar{y}\iota$. Conversely, $M_\iota = I_n - P_\iota$ demeans a vector, $M_\iota y = y - \bar{y}\iota$.

Proof. Note that $\iota'\iota = n$, and $\iota'y = \sum_{i=1}^n y_i$ is a scalar. Therefore,
$$P_\iota y = \iota(\iota'\iota)^{-1}\iota'y = \frac{1}{n}\iota\iota'y = \bar{y}\,\iota.$$
Also, we have
$$M_\iota y = (I_n - P_\iota)y = y - \bar{y}\,\iota,$$
which is the demeaned vector; this completes the proof.
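Finally, a short sketch (illustrative only, on arbitrary simulated data) of the averaging and demeaning property, $P_\iota y = \bar{y}\iota$ and $M_\iota y = y - \bar{y}\iota$:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10
y = rng.normal(size=n)
iota = np.ones(n)                   # constant term

P_iota = np.outer(iota, iota) / n   # iota (iota'iota)^{-1} iota' = (1/n) iota iota'
M_iota = np.eye(n) - P_iota

print(np.allclose(P_iota @ y, y.mean() * iota))  # averaging
print(np.allclose(M_iota @ y, y - y.mean()))     # demeaning
```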